
The Online Safety Act 2023

Eva Wallace, R&P Solicitor

Current position

The Online Safety Bill became the Online Safety Act (OSA) when it received royal assent earlier this month.

The OSA is a comprehensive (and indeed, complex) piece of legislation, with the noble aim of creating a safer digital environment. Ofcom – the UK's communications regulator – is responsible for enforcing its provisions.

It seeks to change the current two-tier liability scheme when it comes to harmful online content. Until now, publishers of illegal content online (i.e., the users) were liable for disseminating it, but online platforms often had a means of avoiding responsibility.

Now, the OSA places the onus on platforms to take active steps to locate and remove harmful content from their sites. Failure to do so could expose them to hefty fines, as well as criminal sanctions for some senior managers if their company fails to comply with the legislation in specific circumstances.

Intermediary services, such as search engine services, can also be held liable under the Act for enabling third-party content to be shared and disseminated online. Previously, such services were only liable if they were put on notice of the illegal content and failed or refused to remove or de-index it. Essentially, this shifts intermediary liability from being reactive (triggered by being put on notice of harmful content) to proactive (being required to put in place systems and processes to reduce the likelihood of harmful content appearing at all). This would be a huge sea change if the OSA is enforced as envisaged.

The new legislation covers user-to-user and search services, which include:

  • social media, photo and video-sharing services;
  • instant messaging services;
  • online gaming services;
  • online search services; and
  • pornography websites.

Other sites, such as those of news organisations (including their comment sections), are exempt from the OSA's remit.

Duties of care

The OSA places wide duties of care on user-to-user services and search engines, focusing on the procedures and systems service providers must have in place to ensure the safety of their users.

In practice, this means that service providers will need to conduct and publish risk assessments regarding harmful content on their sites and, once such content is identified, take steps to remove it.

Three main categories of harmful content are covered:

  1. illegal content* (namely, content relating to certain criminal offences);
  2. content which is legal, but which is harmful to children; and
  3. fraudulent advertising.

*This won't result in a spike of defamation cases, however, as defamation does not fall within category (1), illegal content.

Failing to comply with the OSA's rules could result in companies facing fines of up to £18 million or 10% of their global annual turnover (whichever is higher). Such companies' CEOs and employees could even face jail time.

Ofcom's role

As above, Ofcom is in charge of enforcing the provisions of the Act and is publishing its new codes of practice in three phases.

It has the power to impose fines of up to £18 million or 10% of a company's global turnover (whichever is higher) if it finds that a company has failed in its duty of care. It also has powers to seek court-approved cessation orders against service providers that it considers to be failing in their duties of care.

Ofcom's first round of guidance, published on 9 November, explains: (1) who the rules apply to; (2) what they mean; (3) whether they apply to a specific company and, if so, what it needs to do; and (4) other important things companies should know now.

As anyone dealing with Ofcom knows, it is already overstretched, so this will be a significant addition to its responsibilities.

Controversies

The OSA's expectation that bigger platforms (including social media providers and online search engines) monitor content that is potentially harmful to children, but not illegal, has caused some controversy amongst free-speech activists. Such critics have argued that the power given to online platforms to rigorously moderate online content may result in over-censorship, stifling free speech and the legitimate expression of opinions. As above, the penalties platforms could face for failure to comply with the Act are significant; there is therefore a concern that platforms will err on the side of caution.

Furthermore, by far the most contentious provision in the Act is Section 122. This has been widely interpreted as requiring service providers (such as WhatsApp) to scan users' messages in order to detect potentially illegal content. The concern is that this risks breaking end-to-end encryption, compromising users' privacy and weakening encryption's protections to the point that communications could become vulnerable to hackers. There are also significant technical obstacles: indeed, many big tech companies have argued that no way currently exists to scan encrypted messages without breaking encryption itself. However, the Government has since stated that Ofcom will not use this power until it is actually possible to do so.

Going forward

At the moment, it remains to be seen how this will all play out in practice. While its aims are noble, many industry experts are sceptical as to whether the Act can be effectively enforced. Ofcom has an almost Herculean task ahead of it, and much will depend upon the willingness of online service providers to work with Ofcom on implementation.

There are still many questions regarding the duties of care imposed by the OSA. There will undoubtedly be a significant amount of work for compliance teams, lawyers and more to ensure clients are compliant with the new rules.

This is by no means the end of the road when it comes to online safety regulation. As we all know, technology continues to evolve, social media keeps growing, and new harms will continue to emerge.

In the meantime, service providers should start familiarising themselves with the OSA and the duties of care it imposes on them, to ensure they are prepared for the changes to come.
